We're Completely Unprepared for the Deepfake Porn Boom
Last week, A.I.-generated nude images of pop superstar Taylor Swift were produced and distributed without her consent. They circulated throughout the internet, with a single post on X (formerly Twitter) garnering 45 million views before the site took it down. Deepfakes, as they've come to be called in recent years, often target female celebrities, but with the rise of A.I., it's easier than ever for everyday people (almost always women) to be targeted. Last year, more than 143,000 deepfake porn videos were created, according to one estimate from the independent researcher Genevieve Oh — more than in all previous years combined. That number will, in all likelihood, only continue to rise.
Disturbing app can create nude images of ANY woman
A disturbing app has been developed that uses artificial intelligence to produce fake nude images of women. The app, called DeepNude, removes all clothing from any uploaded image of a woman, sparking fears it could be used to blackmail unsuspecting victims with fake revenge porn threats. Since the app came to light, it has been taken offline; its developers claimed it "cannot cope" with the volume of interest. The anonymous developers said they would be back within days and just needed "to fix some bugs and catch our breath." In the free version of the app, the output images are partially covered with a large watermark.